Dimensionality reduction in Hilbert spaces
Author
Abstract
Dimensionality reduction is a generic name for any procedure that takes a complicated object living in a high-dimensional (or possibly even infinite-dimensional) space and approximates it in some sense by a finite-dimensional vector. We are interested in a particular class of dimensionality reduction methods. Consider a data source that generates vectors in some Hilbert space H, which is either infinite-dimensional or has a finite but extremely large dimension (think R^d with the usual Euclidean norm, where d is huge). We will assume that the vectors of interest lie in the unit ball of H, ...
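The abstract does not name a specific reduction map, but one classical way to approximate a vector in a huge-dimensional space by a short finite-dimensional vector, while nearly preserving its norm, is a Gaussian random projection (a Johnson–Lindenstrauss-style sketch). The dimensions and scaling below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 10_000, 200  # huge ambient dimension, small target dimension

# A vector in the unit ball of R^d (a stand-in for an element of H)
x = rng.standard_normal(d)
x /= 2 * np.linalg.norm(x)  # norm 1/2, strictly inside the unit ball

# Gaussian random projection, scaled so norms are preserved in expectation
P = rng.standard_normal((k, d)) / np.sqrt(k)
y = P @ x  # k-dimensional approximation of x

# ||y|| concentrates around ||x|| with relative error on the order of 1/sqrt(k)
print(np.linalg.norm(x), np.linalg.norm(y))
```

The same idea applies whenever the vectors of interest lie in a bounded set: the projection is data-independent, so a single random matrix works for a whole stream of inputs.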
Similar resources
Dimensionality Reduction for Supervised Learning with Reproducing Kernel Hilbert Spaces
We propose a novel method of dimensionality reduction for supervised learning problems. Given a regression or classification problem in which we wish to predict a response variable Y from an explanatory variable X, we treat the problem of dimensionality reduction as that of finding a low-dimensional “effective subspace” for X which retains the statistical relationship between X and Y . We show ...
Kernel Dimensionality Reduction for Supervised Learning with Reproducing Kernel Hilbert Spaces
We propose a novel method of dimensionality reduction for supervised learning problems. Given a regression or classification problem in which we wish to predict a response variable Y from an explanatory variable X , we treat the problem of dimensionality reduction as that of finding a low-dimensional “effective subspace” of X which retains the statistical relationship between X and Y . We show ...
Unsupervised Riemannian Clustering of Probability Density Functions
We present an algorithm for grouping families of probability density functions (pdfs). We exploit the fact that under the square-root re-parametrization, the space of pdfs forms a Riemannian manifold, namely the unit Hilbert sphere. An immediate consequence of this re-parametrization is that different families of pdfs form different submanifolds of the unit Hilbert sphere. Therefore, the proble...
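The square-root re-parametrization mentioned above can be sketched numerically: since a pdf p integrates to 1, its square root has unit L2 norm, so every pdf maps to a point on the unit Hilbert sphere, where the geodesic (great-circle) distance is arccos of the L2 inner product. The discretization grid and the two example densities below are illustrative assumptions, not the paper's code:

```python
import numpy as np

# Discretize two pdfs on [0, 1]: a Gaussian bump and the uniform density
x = np.linspace(0.0, 1.0, 1001)
dx = x[1] - x[0]
p = np.exp(-((x - 0.5) ** 2) / 0.02)
p /= p.sum() * dx  # normalize so p integrates to 1
q = np.ones_like(x)
q /= q.sum() * dx  # uniform pdf, also normalized

# Square-root re-parametrization: psi = sqrt(p) has unit L2 norm,
# so each pdf becomes a point on the unit Hilbert sphere.
psi_p, psi_q = np.sqrt(p), np.sqrt(q)

# Geodesic distance on the sphere between the two pdfs
inner = (psi_p * psi_q).sum() * dx  # <sqrt(p), sqrt(q)> in L2
dist = np.arccos(np.clip(inner, -1.0, 1.0))
print(dist)
```

Because the sphere is a Riemannian manifold with a closed-form geodesic distance, standard manifold clustering algorithms can then be applied directly to the mapped points.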
Statistical Dimension Reduction in Non-Linear Manifolds for Brain Shape Analysis, Connectomics & Brain-Computer Interfaces
In many applications, data belong to non-linear sub-manifolds of a high dimensional space. The natural invariance properties of the space in which the data live often encode informative priors that turn out to be key features to improve the results of analyses. This is the case in Computational Anatomy (CA), Brain Computer Interfaces (BCI) and Brain Connectomics where data naturally belong to s...
Kernel Dimensionality Reduction for Supervised Learning
We propose a novel method of dimensionality reduction for supervised learning. Given a regression or classification problem in which we wish to predict a variable Y from an explanatory vector X , we treat the problem of dimensionality reduction as that of finding a low-dimensional “effective subspace” of X which retains the statistical relationship between X and Y . We show that this problem ca...
Publication date: 2013